Approximating Hessians in unconstrained optimization arising from discretized problems
Authors
Abstract
We consider Hessian approximation schemes for large-scale unconstrained optimization in the context of discretized problems. The Hessians of interest typically exhibit a nontrivial sparsity and partial separability structure, which allows iterative quasi-Newton methods to solve these problems despite their size. Structured finite-difference methods and updating schemes based on the secant equation are presented and compared numerically within the multilevel trust-region algorithm proposed by Gratton, Mouffe, Toint, and Weber Mendonça (IMA J. Numer. Anal. 28(4), 2008).
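To make the structure mentioned in the abstract concrete, the following minimal Python sketch (not the authors' implementation) assembles a structured finite-difference Hessian approximation for a partially separable objective written as a sum of two-variable element functions; the element functions, index sets, and step size are illustrative assumptions, and the secant-based updating schemes mentioned in the abstract are not shown here.

```python
# Minimal sketch (not the authors' implementation) of a structured
# finite-difference Hessian for a partially separable objective
#   f(x) = sum_i f_i(x[idx_i]),
# where each element function depends on only a few variables, so the full
# Hessian is a sum of small dense blocks scattered into a sparse matrix.
import numpy as np

def element_grad(fi, xe, h=1e-6):
    """Forward-difference gradient of one small element function."""
    g = np.zeros_like(xe)
    f0 = fi(xe)
    for j in range(xe.size):
        xp = xe.copy()
        xp[j] += h
        g[j] = (fi(xp) - f0) / h
    return g

def structured_fd_hessian(elements, x, h=1e-6):
    """Assemble a Hessian approximation element by element.

    `elements` is a list of (fi, idx) pairs: fi acts only on x[idx], so its
    small Hessian block is estimated by differencing element gradients and
    then added into the corresponding rows/columns of the full matrix.
    """
    n = x.size
    H = np.zeros((n, n))
    for fi, idx in elements:
        xe = x[idx]
        m = len(idx)
        g0 = element_grad(fi, xe, h)
        He = np.empty((m, m))
        for j in range(m):
            xp = xe.copy()
            xp[j] += h
            He[:, j] = (element_grad(fi, xp, h) - g0) / h
        H[np.ix_(idx, idx)] += 0.5 * (He + He.T)   # symmetrize the block
    return H

# Illustrative chained Rosenbrock-type objective on R^5, split into
# two-variable element functions.
element = lambda z: 100.0 * (z[1] - z[0] ** 2) ** 2 + (1.0 - z[0]) ** 2
n = 5
elements = [(element, [k, k + 1]) for k in range(n - 1)]
x = np.linspace(0.2, 1.0, n)
B = structured_fd_hessian(elements, x)
print("sparsity pattern:\n", (np.abs(B) > 1e-8).astype(int))   # tridiagonal
```

The element-wise assembly is what keeps the cost manageable: each small block needs only a few extra gradient evaluations of its own element function, and the overall approximation inherits the sparsity pattern of the discretization.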
Similar articles
Solving the Unconstrained Optimization Problems Using the Combination of Nonmonotone Trust Region Algorithm and Filter Technique
In this paper, we propose a new nonmonotone adaptive trust-region method, equipped with the filter technique, for solving unconstrained optimization problems. The proposed method uses a variant of the nonmonotone technique, which lets the algorithm benefit from nonmonotone properties and increases the rate at which problems are solved. Also, the filter that is used in...
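As a point of reference for the nonmonotone idea in this snippet, the sketch below shows the generic max-type nonmonotone acceptance ratio used by many nonmonotone trust-region methods; it is not the paper's specific nonmonotone rule or filter mechanism, and the history length, objective values, and thresholds are assumed.

```python
# Generic max-type nonmonotone acceptance test used by many nonmonotone
# trust-region methods (not the paper's specific nonmonotone/filter rule):
# the trial objective value is compared with the worst of the last few
# accepted values instead of the current value only.
from collections import deque

def nonmonotone_ratio(f_history, f_trial, predicted_reduction):
    """Achieved vs. predicted reduction, measured from the max of recent values."""
    return (max(f_history) - f_trial) / predicted_reduction

history = deque([10.0, 9.0, 9.2], maxlen=5)   # illustrative recent objective values
rho = nonmonotone_ratio(history, f_trial=9.4, predicted_reduction=0.3)
# The trial value 9.4 is worse than the current value 9.2, yet the
# nonmonotone ratio is still positive because it is measured from 10.0.
print("rho =", rho, "-> accept:", rho >= 0.1)
```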
A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems
In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
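The key requirement in this snippet, a positive definite approximation satisfying the standard secant relation B_{k+1} s_k = y_k, can be illustrated with the classical BFGS update; this is only a stand-in for the paper's double-parameter scaled formula, whose exact form is not given here, and the quadratic test data are made up.

```python
# Illustration of the standard secant relation B_{k+1} s = y using the
# classical BFGS update; a stand-in for the paper's double-parameter scaled
# formula, which is not reproduced here.
import numpy as np

def bfgs_update(B, s, y):
    """BFGS update of B; keeps positive definiteness when s @ y > 0."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (s @ y)

n = 4
rng = np.random.default_rng(0)
A = np.diag([1.0, 2.0, 3.0, 4.0])   # Hessian of an illustrative quadratic
s = rng.standard_normal(n)          # a step
y = A @ s                           # matching gradient difference, so s @ y > 0
B_new = bfgs_update(np.eye(n), s, y)

print("secant residual :", np.linalg.norm(B_new @ s - y))            # ~ 0
print("positive definite:", bool(np.all(np.linalg.eigvalsh(B_new) > 0)))
```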
Interior Methods For a Class of Elliptic Variational Inequalities
We consider the application of primal-dual interior methods to the optimization of systems arising in the finite-element discretization of a class of elliptic variational inequalities. These problems lead to very large (possibly non-convex) optimization problems with upper and lower bound constraints. When interior methods are applied to the discretized problem, the resulting linear systems hav...
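For context on the linear algebra such interior methods produce, the sketch below assembles the generic condensed primal-dual matrix for simple bound constraints l <= x <= u, in which the Hessian is shifted by a diagonal barrier term; this is a textbook construction with made-up data, not the specific systems analyzed in the paper.

```python
# Generic condensed primal-dual matrix that interior methods build for simple
# bound constraints l <= x <= u: the Hessian is shifted by a diagonal barrier
# term.  Textbook construction with illustrative data, not the paper's systems.
import numpy as np

def condensed_matrix(H, x, l, u, z_l, z_u):
    """H plus the diagonal contribution of the bound multipliers z_l, z_u."""
    sigma = z_l / (x - l) + z_u / (u - x)
    return H + np.diag(sigma)

n = 4
H = 2.0 * np.eye(n)                      # illustrative (convex) Hessian block
x = np.full(n, 0.5)
l, u = np.zeros(n), np.ones(n)           # bounds 0 <= x <= 1
z_l = z_u = np.full(n, 0.1)              # illustrative bound multipliers
K = condensed_matrix(H, x, l, u, z_l, z_u)
print(np.diag(K))   # the barrier term grows as iterates approach a bound
```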
Approximating optimization problems over convex functions
Many problems of theoretical and practical interest involve finding an optimum over a family of convex functions. For instance, finding the projection onto the convex functions in H(Ω), and optimizing functionals arising from some problems in economics. In the continuous setting and assuming smoothness, the convexity constraints may be given locally by requiring the Hessian matrix to be positive sem...
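To make the local convexity constraint concrete, here is a minimal one-dimensional illustration: on a uniform grid, requiring a nonnegative second derivative reduces to nonnegative second differences. This is only an assumed 1D analogue of the Hessian-semidefiniteness condition described above, not the paper's discretization.

```python
# One-dimensional illustration of the local convexity constraint: on a uniform
# grid, "second derivative >= 0" becomes "second differences >= 0".
import numpy as np

def is_discretely_convex(u, tol=1e-12):
    """Check u[i-1] - 2*u[i] + u[i+1] >= 0 at every interior grid point."""
    second_diff = u[:-2] - 2.0 * u[1:-1] + u[2:]
    return bool(np.all(second_diff >= -tol))

x = np.linspace(0.0, 1.0, 50)
print(is_discretely_convex(x ** 2))           # True: convex on [0, 1]
print(is_discretely_convex(np.sin(3.0 * x)))  # False: concave on most of [0, 1]
```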
An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems
In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem, and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...
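For orientation, the sketch below implements a generic nonlinear conjugate gradient iteration with the Polak-Ribière+ parameter and a backtracking Armijo line search; the paper's parameter, derived from a modified secant condition, is different and is not reproduced here, and the quadratic test problem is illustrative.

```python
# Generic nonlinear conjugate gradient iteration with the Polak-Ribiere+
# parameter and a backtracking Armijo line search; the paper's parameter,
# obtained from a modified secant condition, is different and not shown here.
import numpy as np

def nonlinear_cg(f, grad, x0, iters=200, tol=1e-8):
    x, g = x0.astype(float), grad(x0)
    d = -g
    for _ in range(iters):
        t = 1.0                                   # backtracking Armijo search
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < tol:
            return x_new
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PR+ parameter
        d = -g_new + beta * d
        if g_new @ d >= 0.0:                      # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Illustrative strictly convex quadratic test problem.
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
print(nonlinear_cg(f, grad, np.zeros(3)))   # approaches A^{-1} b = [1, 0.1, 0.01]
```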
Journal title: Comp. Opt. and Appl.
Volume: 50
Issue: -
Pages: -
Year: 2011